Perception of perspective in augmented reality head-up displays
Augmented Reality (AR) is emerging rapidly, with a wide range of applications including automotive AR Head-Up Displays (AR HUDs). As a result, there is a growing need to understand human depth perception in AR. Here, we discuss two user studies on depth perception, focusing on the perspective cue. The first experiment compares perception of the perspective depth cue (1) in the physical world, (2) on a flat screen, and (3) on an AR HUD. Our AR HUD setup provided a two-dimensional, vertically oriented virtual image projected at a fixed distance. In each setting, participants were asked to estimate the size of a perspective angle. We found that the perception of angle sizes on the AR HUD differs from perception in the physical world, but not from that on a flat screen. The underestimation of angle size in the physical world relative to the AR HUD and screen setups might explain the egocentric depth underestimation phenomenon in virtual environments. In the second experiment, we compared perception across different graphical representations of angles that are relevant for practical applications. Graphical alterations of angles displayed on a screen resulted in more variation between individuals' angle-size estimates. Furthermore, the majority of participants tended to underestimate the observed angle size in most conditions. Our results suggest that perspective angles on a vertically oriented, fixed-depth AR HUD more closely mimic perception of a screen than perception of the physical 3D environment. On-screen graphical alteration does not reduce this underestimation in the majority of cases.
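To make the "perspective angle" idea concrete, the following is a minimal sketch of how an angle lying on the ground plane maps to an apparent angle on a vertical image plane under a standard pinhole projection. This is only an illustration of the geometry involved; the function names, viewing height, and distances are hypothetical and are not taken from the study's apparatus.

```python
import math

def project(point, f=1.0):
    """Pinhole projection onto a vertical image plane at focal length f.
    point = (x, y, z) in camera coordinates, with z > 0 looking forward."""
    x, y, z = point
    return (f * x / z, f * y / z)

def apparent_angle(physical_angle_deg, height=1.5, near=5.0, far=15.0):
    """Apparent (projected) size of an angle drawn on the ground plane.

    The angle's vertex lies on the ground (y = -height) at distance `near`;
    its two edges run out toward distance `far`, separated symmetrically by
    physical_angle_deg. All parameter values are illustrative assumptions.
    """
    half = math.radians(physical_angle_deg) / 2.0
    length = far - near
    vertex = project((0.0, -height, near))
    # End points of the two edges, symmetric about the forward (z) axis.
    left = project((-length * math.sin(half), -height,
                    near + length * math.cos(half)))
    right = project((length * math.sin(half), -height,
                     near + length * math.cos(half)))
    # Angle between the two projected edge vectors at the projected vertex.
    v1 = (left[0] - vertex[0], left[1] - vertex[1])
    v2 = (right[0] - vertex[0], right[1] - vertex[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    cos_a = dot / (math.hypot(*v1) * math.hypot(*v2))
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_a))))
```

Because perspective foreshortens the depth axis, a ground-plane angle viewed from eye height generally projects to a different apparent size than its physical size, which is the mismatch the participants' estimates probe.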